Generalized Kullback-Leibler information and its extensions to censored and discrete cases


Similar resources

From information theoretic dualities to Path Integral and Kullback Leibler control: Continuous and Discrete Time formulations

This paper presents a unified view of stochastic optimal control theory as developed within the machine learning and control theory communities. In particular, we show the mathematical connection between recent work on Path Integral (PI) and Kullback-Leibler (KL) divergence stochastic optimal control and earlier work on risk sensitivity and the fundamental dualities between free energy a...


Kullback-Leibler Aggregation and Misspecified Generalized Linear Models

In a regression setup with deterministic design, we study the pure aggregation problem and introduce a natural extension from the Gaussian distribution to distributions in the exponential family. While this extension bears strong connections with generalized linear models, it does not require identifiability of the parameter or even that the model on the systematic component is true. It is show...


Testing Normality Based on Kullback-Leibler Information With Progressively Type-II Censored Data

We use the joint entropy of progressively censored order statistics, expressed in terms of an incomplete integral of the hazard function, and provide a simple estimate of the joint entropy of progressively Type-II censored data introduced by Balakrishnan et al. (2007). We then construct a goodness-of-fit test statistic based on Kullback-Leibler information for the normal distribution. Finally, ...
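To illustrate the general idea behind such entropy-based KL tests, the sketch below (plain Python; function names are illustrative, and it handles only the uncensored i.i.d. case, not the progressively censored setting of the paper) combines Vasicek's spacing-based entropy estimator with the cross-entropy of a fitted normal to form a KL-type normality statistic:

```python
import math

def vasicek_entropy(x, m):
    """Vasicek (1976) spacing-based entropy estimator for an i.i.d. sample,
    with window size m and the usual boundary adjustments."""
    xs = sorted(x)
    n = len(xs)
    total = 0.0
    for i in range(n):
        hi = xs[min(i + m, n - 1)]
        lo = xs[max(i - m, 0)]
        total += math.log(n * (hi - lo) / (2 * m))
    return total / n

def kl_normality_statistic(x, m):
    """Estimated KL information between the sample law and a fitted normal:
    KL ~ -H_mn + 0.5*log(2*pi*sigma2) + 0.5.  Large values suggest
    departure from normality."""
    n = len(x)
    mu = sum(x) / n
    sigma2 = sum((xi - mu) ** 2 for xi in x) / n
    return -vasicek_entropy(x, m) + 0.5 * math.log(2 * math.pi * sigma2) + 0.5
```

For a uniform sample the statistic approaches the true KL information between the uniform law and its best normal fit, roughly 0.18, while for normal data it tends to zero.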


Optimal Kullback-Leibler Aggregation via Information Bottleneck

In this paper, we present a method for reducing a regular, discrete-time Markov chain (DTMC) to another DTMC with a given, typically much smaller number of states. The cost of reduction is defined as the Kullback–Leibler divergence rate between a projection of the original process through a partition function and a DTMC on the correspondingly partitioned state space. Finding the reduced model w...
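The reduction cost described here is a KL divergence rate between Markov chains. As a minimal sketch (plain Python, illustrative names; the paper's information-bottleneck search itself is not reproduced), the rate between two chains on the same state space can be computed as the expected transition-wise KL under the first chain's stationary distribution:

```python
import math

def stationary(P, iters=500):
    """Approximate the stationary distribution of a regular DTMC
    by power iteration on the transition matrix P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def kl_divergence_rate(P, Q):
    """Per-step KL divergence rate between DTMCs P and Q:
    sum_i pi_i * sum_j P_ij * log(P_ij / Q_ij)."""
    pi = stationary(P)
    rate = 0.0
    for i, row in enumerate(P):
        for j, pij in enumerate(row):
            if pij > 0.0:
                rate += pi[i] * pij * math.log(pij / Q[i][j])
    return rate
```

The rate is zero when the chains coincide and strictly positive otherwise, which is what makes it usable as a reduction cost.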


Image Recognition Using Kullback-Leibler Information Discrimination

The problem of automatic image recognition based on the minimum information discrimination principle is formulated and solved. A comparison of color histograms in the Kullback-Leibler information metric is proposed, combined with the method of directed enumeration of alternatives, as opposed to complete enumeration of competing hypotheses. Results of an experimental study of the Kullback-Leibler discri...
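A minimal sketch of the histogram-comparison step (plain Python; function names are illustrative, and the directed-enumeration search from the abstract is replaced here by a simple minimum over all references):

```python
import math

def kl_divergence(p, q, eps=1e-9):
    """Discrete KL divergence D(p || q) between two (unnormalized)
    histograms; eps-smoothing guards against empty bins."""
    ps = [pi + eps for pi in p]
    qs = [qi + eps for qi in q]
    zp, zq = sum(ps), sum(qs)
    return sum((pi / zp) * math.log((pi / zp) / (qi / zq))
               for pi, qi in zip(ps, qs))

def nearest_by_kl(query_hist, reference_hists):
    """Minimum-information-discrimination decision: index of the
    reference histogram closest to the query in KL information."""
    return min(range(len(reference_hists)),
               key=lambda j: kl_divergence(query_hist, reference_hists[j]))
```

For example, a query histogram dominated by its first bin matches a reference with a similar first-bin mass rather than one with the mass shifted elsewhere.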



Journal

Journal title: Journal of the Korean Data and Information Science Society

Year: 2012

ISSN: 1598-9402

DOI: 10.7465/jkdi.2012.23.6.1223